Search results for "Joint entropy"

Showing 4 of 4 documents

Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications

2012

Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resulting F, Shannon, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study extends this notion by introducing other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subg…
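All four entropy-based index families named in the abstract derive from one probability distribution built from a frequency matrix. As a hedged illustration (a toy 2×2 matrix standing in for the paper's F, not the authors' software):

```python
import numpy as np

def entropy_indices(F):
    """Shannon, joint, conditional and mutual entropies from a
    non-negative frequency matrix (toy stand-in for the paper's F)."""
    P = F / F.sum()                      # joint probabilities p(x, y)
    px = P.sum(axis=1)                   # row marginal
    py = P.sum(axis=0)                   # column marginal

    def H(p):                            # Shannon entropy in bits
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H_xy = H(P.ravel())
    H_x, H_y = H(px), H(py)
    return {
        "H(X)": H_x,
        "H(Y)": H_y,
        "H(X,Y)": H_xy,
        "H(Y|X)": H_xy - H_x,            # conditional entropy
        "I(X;Y)": H_x + H_y - H_xy,      # mutual information
    }

# Toy 2x2 frequency matrix
ix = entropy_indices(np.array([[2.0, 1.0], [1.0, 2.0]]))
```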

Keywords: Quantitative structure–activity relationship; Entropy (information theory); Chemistry, Organic; Information theory; Bioengineering; Joint entropy; Molecular descriptor; Drug Discovery; Computer Graphics; Cluster Analysis; Quantum; Mathematics; Discrete mathematics; Molecular Structure; Linear models; Computational Biology; General Medicine; Ethylenes; Models, Theoretical; Molecular Medicine; Substructure; Hydrophobic and Hydrophilic Interactions; Algorithms; Software
Journal: SAR and QSAR in Environmental Research

Relations frequency hypermatrices in mutual, conditional and joint entropy-based information indices.

2012

Graph-theoretic matrix representations constitute the most popular and significant source of topological molecular descriptors (MDs). Recently, we have introduced a novel matrix representation, named the duplex relations frequency matrix, F, derived from the generalization of an incidence matrix whose row entries are connected subgraphs of a given molecular graph G. Using this matrix, a series of information indices (IFIs) was proposed. In this report, an extension of F is presented, introducing for the first time the concept of a hypermatrix in graph-theoretic chemistry. The hypermatrix representation explores the n-tuple participation frequencies of vertices in a set of connected subgrap…
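The n-tuple participation idea can be sketched as follows; the subgraph encoding, the ordered-tuple convention, and the function name are illustrative assumptions, not the published algorithm:

```python
import itertools
import numpy as np

def participation_hypermatrix(subgraphs, n_vertices, n=3):
    """H[i, j, k] = number of subgraphs whose vertex set contains
    vertices i, j and k (ordered n-tuples, repeats allowed)."""
    H = np.zeros((n_vertices,) * n, dtype=int)
    for sg in subgraphs:                 # sg: vertex set of one subgraph
        for tup in itertools.product(set(sg), repeat=n):
            H[tup] += 1
    return H

# Toy example: three connected subgraphs of a 4-vertex molecular graph,
# each given by its vertex set
H = participation_hypermatrix([{0, 1}, {1, 2, 3}, {0, 1, 2}], n_vertices=4)
```

For n = 2 this collapses to an ordinary matrix of pairwise co-occurrence counts, which is why the hypermatrix reads as a generalization of the duplex matrix F.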

Keywords: Thermodynamic state; Entropy (information theory); Matrix representation; Matrix (mathematics); Statistical parameter; Incidence matrix; General Chemistry; Chemistry; Ethylenes; Joint entropy; Combinatorics; Computational Mathematics; Models, Chemical; Data Mining; Molecular graph; Computer Simulation; Mathematics
Journal: Journal of Computational Chemistry

Accelerating Causal Inference and Feature Selection Methods through G-Test Computation Reuse

2021

This article presents a novel and remarkably efficient method of computing the statistical G-test, made possible by exploiting a connection with the fundamental elements of information theory: by writing the G statistic as a sum of joint entropy terms, its computation is decomposed into easily reusable partial results with no change in the resulting value. This method greatly improves the efficiency of applications that perform a series of G-tests on permutations of the same features, such as feature selection and causal inference applications, because the decomposition allows for intensive reuse of these partial results. The efficiency of this method is demonstrated by implementing it as…
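The decomposition described in the abstract can be illustrated for the pairwise independence case, where the G statistic takes the entropy form G = 2N·(H(X) + H(Y) − H(X, Y)) in natural-log units. A minimal sketch, with memoized entropy terms standing in for the paper's computation reuse (dataset, names, and the caching mechanism are illustrative):

```python
import math
from collections import Counter
from functools import lru_cache

# Toy dataset: one column of observations per feature (values are made up)
DATA = {
    "x": (0, 0, 1, 1, 0, 1, 1, 0),
    "y": (0, 1, 1, 1, 0, 1, 0, 0),
}

@lru_cache(maxsize=None)
def H(*names):
    """Empirical joint entropy (nats) of the named columns; cached so every
    later test over the same columns reuses the partial result."""
    rows = list(zip(*(DATA[n] for n in names)))
    n = len(rows)
    return -sum((c / n) * math.log(c / n) for c in Counter(rows).values())

def g_statistic(a, b):
    """Pairwise independence G-test assembled from cached entropy terms."""
    n = len(DATA[a])
    return 2.0 * n * (H(a) + H(b) - H(*sorted((a, b))))  # canonical cache key
```

Sorting the names before the joint-entropy lookup makes permutations of the same features hit the same cache entry, which is the kind of reuse the article exploits at scale.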

Keywords: Markov blanket; Computer science; Computation reuse; G-test; Conditional mutual information; Joint entropy; Entropy; Information theory; Feature selection; Causal inference; Algorithm; Context (language use); General Physics and Astronomy; Astrophysics

Estimating the decomposition of predictive information in multivariate systems

2015

In the study of complex systems from observed multivariate time series, the evolution of one system is often under investigation; this evolution can be explained by the information storage of the system and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of co…
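The storage/transfer decomposition of predictive information can be illustrated on discrete data, where storage is S = I(Y_n; Y_past) and transfer is the conditional mutual information T = I(Y_n; X_past | Y_past). The following is a simplified lag-1 plug-in sketch under those definitions, not the paper's nonuniform embedding estimator:

```python
import math
from collections import Counter

def H(samples):
    """Plug-in Shannon entropy (nats) of a sequence of symbols or tuples."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def storage_and_transfer(x, y):
    """Decompose lag-1 predictive information about y into storage
    S = I(Y_n; Y_past) and transfer T = I(Y_n; X_past | Y_past)."""
    yn, yp, xp = y[1:], y[:-1], x[:-1]
    S = H(yn) + H(yp) - H(list(zip(yn, yp)))
    T = (H(list(zip(yn, yp))) + H(list(zip(yp, xp)))
         - H(list(zip(yn, yp, xp))) - H(yp))
    return S, T

# Toy series in which y copies x with a one-step delay, so most of the
# predictive information about y arrives as transfer from x
x = (0, 1, 1, 0, 1, 0, 0, 1)
y = (0, 0, 1, 1, 0, 1, 0, 0)
S, T = storage_and_transfer(x, y)
```

Because y here is fully determined by (Y_past, X_past), the sum S + T recovers the whole entropy of the target, with the transfer term dominating.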

Keywords: Statistics and Probability; Statistics; Computer science; Entropy (information theory); Transfer entropy; Approximate entropy; Conditional entropy; Binary entropy function; Joint entropy; Conditional mutual information; Maximum entropy spectral estimation; Information theory; Granger causality; Stochastic Processes; Information Storage and Retrieval; Heart Rate; Heart rate variability; Sleep; Sleep EEG; Electroencephalography; Brain; Cardiorespiratory; Physiological time series; Oscillations; Complexity; Mechanisms; Nonlinear Dynamics; Models, Theoretical; Linear Models; Multivariate Analysis; Algorithm; Science General; Condensed Matter Physics; Statistical and Nonlinear Physics; Settore ING-INF/06 - Bioingegneria Elettronica e Informatica